Boosting Unsupervised Competitive Learning Ensembles
Authors
Abstract
Topology preserving mappings are valuable tools for data visualization and inspection of large datasets. This research presents a combination of several topology preserving mapping models with basic classifier ensemble and boosting techniques in order to improve the stability and, by extension, the classification capabilities of the former. A study and comparison of the performance of several novel and classical ensemble techniques are presented to test their suitability, both for data visualization and for classification, when combined with topology preserving models such as the SOM, ViSOM or ML-SIM.
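To make the combination concrete, the following Python sketch shows one basic way to apply bagging to self-organizing maps: each map is trained on a bootstrap resample, its units are labelled by the classes of the training points they win, and the ensemble output is a majority vote across maps. This is an illustrative assumption, not the specific fusion scheme studied in the paper; the third-party MiniSom package, the grid size and the voting rule are all choices made for the example.

    from collections import Counter
    import numpy as np
    from minisom import MiniSom  # third-party SOM implementation

    def train_bagged_soms(X, y, n_maps=5, grid=(10, 10), n_iter=5000, seed=0):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        rng = np.random.default_rng(seed)
        ensemble = []
        for m in range(n_maps):
            idx = rng.integers(0, len(X), size=len(X))  # bootstrap resample
            som = MiniSom(grid[0], grid[1], X.shape[1], sigma=1.0,
                          learning_rate=0.5, random_seed=seed + m)
            som.train_random(X[idx], n_iter)
            # Label each unit with the majority class of the samples it wins.
            hits = {}
            for xi, yi in zip(X[idx], y[idx]):
                hits.setdefault(som.winner(xi), []).append(yi)
            labels = {u: Counter(v).most_common(1)[0][0] for u, v in hits.items()}
            ensemble.append((som, labels))
        return ensemble

    def ensemble_predict(ensemble, x, default=0):
        # Each map votes with the label of its best-matching unit;
        # the ensemble returns the most common vote.
        ballots = [labels.get(som.winner(x), default) for som, labels in ensemble]
        return Counter(ballots).most_common(1)[0][0]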
Similar Articles
Acceleration technique for boosting classification and its application to face detection
We propose an acceleration technique for boosting classification that incurs no loss of classification accuracy, and apply it to a face detection task. In classification tasks, much effort has been devoted to improving classification accuracy and reducing the computational cost of training. Beyond these, the computational cost of classification itself can be critical in several applications, including...
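As an illustration of the general idea (the paper's specific technique may differ), one standard way to speed up evaluation of a boosted classifier without changing any of its decisions is to scan the weak learners in order and stop as soon as the remaining total weight can no longer flip the sign of the running score:

    def fast_boosted_predict(weak_learners, alphas, x):
        # weak_learners: callables returning +1 or -1; alphas: their
        # nonnegative weights, as in AdaBoost's weighted vote.
        score = 0.0
        remaining = sum(alphas)
        for h, a in zip(weak_learners, alphas):
            score += a * h(x)
            remaining -= a
            if abs(score) > remaining:  # leftover weight cannot flip the sign
                break
        return 1 if score >= 0 else -1

Because evaluation only stops once the outcome is mathematically decided, the early exit returns exactly the same label as the full weighted vote.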
SelfieBoost: A Boosting Algorithm for Deep Learning
We describe and analyze a new boosting algorithm for deep learning called SelfieBoost. Unlike other boosting algorithms, such as AdaBoost, which construct ensembles of classifiers, SelfieBoost boosts the accuracy of a single network. We prove a log(1/ε) convergence rate for SelfieBoost under an "SGD success" assumption that seems to hold in practice.
Boosting Lite - Handling Larger Datasets and Slower Base Classifiers
In this paper, we examine ensemble algorithms (Boosting Lite and Ivoting) that provide accuracy approximating that of a single classifier, but require significantly fewer training examples. Such algorithms allow ensemble methods to operate on very large data sets or to use very slow learning algorithms. Boosting Lite is compared with Ivoting, standard boosting, and building a single classifier. Comp...
Combining Bagging and Boosting
Bagging and boosting are among the most popular resampling ensemble methods that generate and combine a diversity of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, i...
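One straightforward way to combine the two ideas, sketched below with scikit-learn (parameter names assume version 1.2 or later; this illustrates the general construction, not necessarily the specific method proposed in the paper), is to wrap a bagging ensemble around boosted base classifiers, so that each bootstrap replicate trains its own AdaBoost model:

    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    bag_of_boosters = BaggingClassifier(
        estimator=AdaBoostClassifier(
            estimator=DecisionTreeClassifier(max_depth=1),  # decision stumps
            n_estimators=50),
        n_estimators=10,  # ten bootstrap replicates, each with its own booster
        random_state=0)
    bag_of_boosters.fit(X, y)
    print(bag_of_boosters.score(X, y))

Bagging the boosted models averages away some of boosting's sensitivity to noisy labels while keeping its low bias on clean data.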
Evaluating Unsupervised Ensembles when applied to Word Sense Induction
Ensembles combine knowledge from distinct machine learning approaches into a general, flexible system. While supervised ensembles frequently show great benefit, unsupervised ensembles prove to be more challenging. We propose evaluating various unsupervised ensembles when applied to the unsupervised task of Word Sense Induction, with a framework for combining diverse feature spaces and clustering ...